Nonlinear Conjugate Gradient Coefficients with Exact and Strong Wolfe Line Searches Techniques

Authors

Abstract

Nonlinear conjugate gradient (CG) methods are very important for solving unconstrained optimization problems, and they have been the subject of extensive research aimed at enhancing them. Exact and strong Wolfe line search techniques are usually used in practice for the analysis and implementation of these methods. To obtain better results, several studies have modified the classical CG methods. The Fletcher-Reeves (FR) method is one of the most well-known: it has strong convergence properties, but it often gives poor numerical results in practice. The main goal of this paper is to enhance the performance of the FR method via a convexity-type modification of its coefficient β_k. We ensure that, with this modification, the method still achieves the sufficient descent condition and global convergence under both exact and strong Wolfe line searches. Numerical results show that the modified FR method is more robust and effective.
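For concreteness, the classical FR update that the paper modifies can be sketched as follows. This is a minimal NumPy sketch using the standard coefficient β_k = ‖g_k‖² / ‖g_{k−1}‖² and an exact line search on a strictly convex quadratic, where the exact step has a closed form; the function name is illustrative, and the paper's convexity-type modification of β_k is not included.

```python
import numpy as np

def fr_cg_quadratic(A, b, x0, tol=1e-10, max_iter=200):
    """Fletcher-Reeves CG for f(x) = 0.5*x^T A x - b^T x (A SPD).

    On a quadratic the exact line search has the closed form
    alpha = -g^T d / (d^T A d), so the classical coefficient
    beta_k = ||g_k||^2 / ||g_{k-1}||^2 can be applied directly.
    """
    x = x0.astype(float)
    g = A @ x - b                       # gradient of f at x
    d = -g                              # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ A @ d)  # exact minimizer along d
        x = x + alpha * d
        g_new = A @ x - b
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves beta_k
        d = -g_new + beta * d
        g = g_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])  # small SPD example
b = np.array([1.0, 2.0])
x = fr_cg_quadratic(A, b, np.zeros(2))  # converges to A^{-1} b
```

On an n-dimensional quadratic this recovers linear CG and terminates in at most n steps; the interest of the nonlinear FR variant is precisely what happens when the objective is not quadratic and the line search is inexact.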


Similar articles

Nonlinear Conjugate Gradient Methods with Wolfe Type Line Search

‖d_k‖² / ‖g_k‖⁴ = ‖d_{k−1}‖² / ‖g_{k−1}‖⁴ + 1/‖g_k‖² − β_k² (g_kᵀ d_{k−1})² / ‖g_k‖⁴


A Conjugate Gradient Method with Strong Wolfe-Powell Line Search for Unconstrained Optimization

In this paper, a modified conjugate gradient method is presented for solving large-scale unconstrained optimization problems; it possesses the sufficient descent property with the strong Wolfe-Powell (SWP) line search. A global convergence result is proved when the SWP line search is used under some conditions. Computational results for a set consisting of 138 unconstrained optimization test probl...
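The strong Wolfe-Powell (SWP) conditions referred to above have a concrete form; below is a small checker, a sketch assuming the conventional constants 0 < c1 < c2 < 1 (the function and variable names are illustrative, not from the paper).

```python
import numpy as np

def satisfies_strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    """Check the strong Wolfe-Powell conditions for a step length alpha.

    Sufficient decrease (Armijo): f(x + a*d) <= f(x) + c1*a*g(x)^T d
    Strong curvature:             |g(x + a*d)^T d| <= c2*|g(x)^T d|
    """
    g0d = grad(x) @ d                   # initial directional derivative
    armijo = f(x + alpha * d) <= f(x) + c1 * alpha * g0d
    curvature = abs(grad(x + alpha * d) @ d) <= c2 * abs(g0d)
    return bool(armijo) and bool(curvature)

# 1-D example: f(x) = x^2, starting at x = 2 along the descent direction d = -1
f = lambda x: float(x @ x)
grad = lambda x: 2.0 * x
x0 = np.array([2.0])
d = np.array([-1.0])
ok = satisfies_strong_wolfe(f, grad, x0, d, alpha=2.0)  # step to the minimizer
```

The curvature condition rejects steps that are too short (the gradient along d is still large), while the Armijo condition rejects steps that are too long; together they bracket acceptable step lengths.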


A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search

In this paper, we seek the conjugate gradient direction closest to the direction of the scaled memoryless BFGS method and propose a family of conjugate gradient methods for unconstrained optimization. An improved Wolfe line search is also proposed, which can avoid a numerical drawback of the Wolfe line search and guarantee the global convergence of the conjugate gradient method under mild condi...
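The scaled memoryless BFGS direction referred to above can be written out explicitly. The sketch below assumes the standard memoryless BFGS matrix built from the initial matrix τI and the most recent pair (s, y); names are illustrative, and the paper's own CG family is not reproduced here.

```python
import numpy as np

def memoryless_bfgs_direction(g, s, y, tau=1.0):
    """d = -H g, where H is the memoryless BFGS matrix built from the
    most recent step s and gradient change y, starting from tau*I:
        H = tau*(I - s y^T/s^T y)(I - y s^T/s^T y) + s s^T/s^T y
    H is positive definite whenever s^T y > 0, so d is a descent direction."""
    sy = s @ y                          # curvature; must be > 0
    I = np.eye(len(g))
    V = I - np.outer(s, y) / sy
    H = tau * (V @ V.T) + np.outer(s, s) / sy
    return -H @ g

g = np.array([1.0, 1.0])                # current gradient
s = np.array([1.0, 0.0])                # last step
y = np.array([1.0, 0.5])                # gradient change; s @ y > 0
d = memoryless_bfgs_direction(g, s, y)
```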


New hybrid conjugate gradient methods with the generalized Wolfe line search

The conjugate gradient method is an efficient technique for solving unconstrained optimization problems. In this paper, we take a linear combination, with parameters β_k, of the DY method and the HS method, and put forward a hybrid method of DY and HS. We also propose a hybrid of FR and PRP by the same means. Additionally, to present the two hybrid methods, we promote the Wolfe line s...
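The two coefficients being combined have simple closed forms. Below is a sketch of a convex DY/HS hybrid; the mixing weight θ and all names are illustrative, not the paper's specific combination.

```python
import numpy as np

def beta_hybrid_dy_hs(g_new, g_old, d_old, theta=0.5):
    """Convex combination theta*beta_DY + (1-theta)*beta_HS.

    With y = g_new - g_old (the gradient change):
      beta_DY = ||g_new||^2 / (d_old^T y)
      beta_HS = (g_new^T y) / (d_old^T y)
    """
    y = g_new - g_old
    denom = d_old @ y                   # shared denominator of DY and HS
    beta_dy = (g_new @ g_new) / denom
    beta_hs = (g_new @ y) / denom
    return theta * beta_dy + (1.0 - theta) * beta_hs

g_old = np.array([1.0, -2.0])
g_new = np.array([0.5, 0.5])
d_old = -g_old                          # previous direction (steepest descent here)
beta = beta_hybrid_dy_hs(g_new, g_old, d_old)
```

Since DY and HS share the denominator d_{k−1}ᵀy_{k−1}, any convex combination of the two can be computed at essentially no extra cost per iteration.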


Exploiting damped techniques for nonlinear conjugate gradient methods

In this paper we propose the use of damped techniques within Nonlinear Conjugate Gradient (NCG) methods. Damped techniques were introduced by Powell and recently re-proposed by Al-Baali and, until now, have only been applied in the framework of quasi-Newton methods. We extend their use to NCG methods in large-scale unconstrained optimization, aiming at possibly improving the efficiency and the robustness of...
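Powell's original damping, which the damped techniques above build on, can be sketched as follows; the threshold η = 0.2 is Powell's conventional choice, and the function name is illustrative.

```python
import numpy as np

def powell_damped_y(s, y, B, eta=0.2):
    """Powell's damping: replace y by y_bar = theta*y + (1-theta)*B s
    so that s^T y_bar >= eta * s^T B s, keeping a BFGS-type update
    positive definite even when s^T y is small or negative."""
    sBs = s @ B @ s
    sy = s @ y
    if sy >= eta * sBs:
        return y                        # curvature already sufficient
    theta = (1.0 - eta) * sBs / (sBs - sy)
    return theta * y + (1.0 - theta) * (B @ s)

B = np.eye(2)                           # current curvature approximation
s = np.array([1.0, 0.0])                # step
y = np.array([-1.0, 0.0])               # gradient change with negative curvature
y_bar = powell_damped_y(s, y, B)        # damped vector with s @ y_bar > 0
```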



Journal

Journal title: Journal of Mathematics

Year: 2022

ISSN: 2314-4785, 2314-4629

DOI: https://doi.org/10.1155/2022/1383129